Research Article | Open Access
Volume 2025 | Article ID 100021 | https://doi.org/10.1016/j.plaphe.2025.100021

Segmenting vegetation from UAV images via spectral reconstruction in complex field environments

Zhixun Pei,1 Xingcai Wu,1 Xue Wu,2 Yuanyuan Xiao,1 Peijia Yu,1 Zhenran Gao,4 Qi Wang,1 and Wei Guo3

1State Key Laboratory of Public Big Data, College of Computer Science and Technology, Guizhou University, Guiyang 550025, China
2State Key Laboratory of Green Pesticide, Guizhou University, Guiyang 550025, China
3Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo 188-0002, Japan
4New Rural Development Research Institute, Guizhou University, Guiyang 550025, China

Received 12 Jun 2024 | Accepted 16 Feb 2025 | Published 01 Mar 2025

Abstract

Segmenting vegetation in remote sensing images minimizes background interference, enabling efficient monitoring and analysis of vegetation information. Vegetation segmentation remains challenging, however, because field conditions are inherently complex. There is a growing trend of combining spectral sensing with deep learning for field vegetation segmentation to cope with such environments, but two major constraints remain: the high cost of the equipment required for field spectral data collection, and the limited availability of field datasets, whose annotation is time-consuming and labor-intensive. To address these challenges, we propose a weakly supervised approach to field vegetation segmentation built on spectral reconstruction (SR) techniques and guided by vegetation index (VI) theory. Specifically, to reduce the cost of data acquisition, we propose SRCNet and SRANet, based on convolutional and attention structures respectively, to reconstruct multispectral images of fields from RGB inputs. Then, drawing on the VI principle, we aggregate the reconstructed data to establish connections between spectral bands, obtaining more salient vegetation information. Finally, we apply an adaptation strategy to segment the fused feature map with a weakly supervised method, which yields field vegetation segmentation results without manual labeling. Our segmentation method achieves a Mean Intersection over Union (MIoU) of 0.853 on real field datasets, outperforming existing methods. In addition, we have open-sourced a dataset of 2358 pairs of unmanned aerial vehicle (UAV) RGB-multispectral images to improve the richness of remote sensing agricultural data. The code and data are available at https://github.com/GZU-SAMLab/VegSegment_SR and http://sr-seg.samlab.cn/.
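To illustrate the VI principle the abstract refers to, the sketch below computes an NDVI-style saliency map from two spectral bands and thresholds it into a coarse vegetation mask. This is a minimal, hypothetical example: the reconstruction networks (SRCNet/SRANet), the actual aggregation scheme, and the weakly supervised segmentation stage are not reproduced here; we simply assume NIR and red bands are available (e.g., as reconstructed outputs) and that a fixed threshold is acceptable for demonstration.

```python
import numpy as np

def ndvi_map(nir: np.ndarray, red: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Vegetation reflects strongly in near-infrared and absorbs red light,
    so NDVI is high over vegetation and low over soil or water.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)  # eps avoids division by zero

def vegetation_mask(nir: np.ndarray, red: np.ndarray,
                    threshold: float = 0.3) -> np.ndarray:
    """Threshold the NDVI map to obtain a coarse binary vegetation mask.

    The 0.3 threshold is an illustrative value, not one from the paper.
    """
    return ndvi_map(nir, red) > threshold

# Toy 2x2 "bands": pixels (0,0) and (1,0) mimic vegetation
# (high NIR, low red); the others mimic bare soil.
nir = np.array([[0.80, 0.10],
                [0.70, 0.20]])
red = np.array([[0.10, 0.30],
                [0.15, 0.25]])
mask = vegetation_mask(nir, red)
print(mask)  # vegetation pixels are True in the first column
```

In the paper's pipeline the fused feature map plays the role of this saliency map, and the weakly supervised adaptation stage replaces the fixed threshold with a learned decision, so no manual labels are needed.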

© 2019-2023 Plant Phenomics. All Rights Reserved. ISSN 2643-6515.
